    Analysis of the contour structural irregularity of skin lesions using wavelet decomposition

    The boundary irregularity of skin lesions is of clinical significance for the early detection of malignant melanomas and for distinguishing them from other lesions such as benign moles. The structural components of the contour are of particular importance. To extract the structure from the contour, wavelet decomposition was used, as these components tend to be located in the lower-frequency sub-bands. Lesion contours were modeled as signatures with scale normalization to give position and frequency resolution invariance. Energy distributions among different wavelet sub-bands were then analyzed to extract those with significant levels and differences to enable maximum discrimination. Based on the coefficients in the significant sub-bands, structural components of the original contours were modeled, and a set of statistical and geometric irregularity descriptors was researched and applied at each of the significant sub-bands. The effectiveness of the descriptors was measured using the Hausdorff distance between sets of data from melanoma and mole contours. The best descriptor outputs were input to a back-propagation neural network to construct a combined classifier system. Experimental results showed that thirteen features from four sub-bands produced the best discrimination between sets of melanomas and moles, and that a small training set of nine melanomas and nine moles was optimal
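
    The processing chain described here (contour signature, wavelet decomposition, sub-band energy analysis) can be sketched briefly. Below is a minimal illustration assuming a centroid-distance signature and the PyWavelets `db4` wavelet; the paper's actual signature model, wavelet family, and decomposition depth are not given in the abstract, so these are placeholder choices.

```python
# Minimal sketch: contour signature + wavelet sub-band energies.
# The wavelet ('db4') and depth (5) are illustrative assumptions.
import numpy as np
import pywt

def contour_signature(points, n_samples=256):
    """Model a closed contour as a 1-D signature: distance from the
    centroid, scale-normalized and resampled to a fixed length."""
    centroid = points.mean(axis=0)
    r = np.linalg.norm(points - centroid, axis=1)
    r /= r.mean()                                   # scale normalization
    idx = np.linspace(0, len(r) - 1, n_samples)
    return np.interp(idx, np.arange(len(r)), r)

def subband_energies(signature, wavelet="db4", level=5):
    """Decompose the signature and return the relative energy in each
    sub-band (approximation first, then detail levels coarse to fine)."""
    coeffs = pywt.wavedec(signature, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# A noisy circle stands in for a lesion contour.
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
radii = 1 + 0.05 * np.random.randn(400)
contour = np.c_[radii * np.cos(theta), radii * np.sin(theta)]
print(subband_energies(contour_signature(contour)))
```

    Sub-bands whose energies differ consistently between melanoma and mole signatures would then be the candidates for the irregularity descriptors.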

    Visual inspection : image sampling, algorithms and architectures

    The thesis concerns the hexagonal sampling of images, the processing of industrially derived images, and the design of a novel processor element that can be assembled into pipelines to effect fast, economic and reliable processing. A hexagonally sampled two-dimensional image can require 13.4% fewer sampling points than a square-sampled equivalent. The grid symmetry results in simpler processing operators that compute more efficiently than square-grid operators. Computation savings approaching 44% are demonstrated. New hexagonal operators are reported, including a Gaussian smoothing filter, a binary thinner, and an edge detector with accuracy comparable to that of the Sobel detector. The design of hexagonal arrays of sensors is considered. Operators requiring small local areas of support are shown to be sufficient for processing controlled-lighting and industrial images. Case studies show that small features in hexagonally processed images maintain their shape better, and that processes can tolerate a lower signal-to-noise ratio, than in equivalent square-processed images. The modelling of small defects in surfaces has been studied in depth. The flexible programmable processor element can perform the low-level local operators required for industrial image processing on both square and hexagonal grids. The element has been specified and simulated by a high-level computer program. A fast communication channel allows for dynamic reprogramming by a control computer, and the video-rate element can be assembled into various pipeline architectures that may eventually be adaptively controlled
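
    The 13.4% saving follows from spectral packing: for a circularly band-limited image, hexagonal sampling needs only √3/2 as many samples as a square grid. A quick check of the arithmetic:

```python
# Sample-count ratio of a hexagonal grid relative to a square grid for a
# circularly band-limited image: sqrt(3)/2, i.e. about 13.4% fewer samples.
import math

ratio = math.sqrt(3) / 2
print(f"hexagonal/square sample ratio = {ratio:.4f}")
print(f"saving = {(1 - ratio) * 100:.1f}%")   # -> 13.4%
```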

    Teaching early numeracy to students with autism using a school staff delivery model

    Mathematics is one of the core school subjects in the UK, and emphasis is placed on developing pupils’ mathematical competencies throughout all key stages. Despite this, the attainment of students with disabilities in mathematics remains low. The current study explored ways in which the Teaching Early Numeracy to children with Developmental Disabilities (TEN-DD) programme could be implemented by teaching staff in a special school in the UK to improve the numeracy skills of students with autism. Adaptations to the delivery of the programme were made during the study as a result of continued collaboration with the participating school. The findings suggest that it may be feasible to implement the TEN-DD programme using a school staff delivery model and that it may help learners improve their early numeracy skills. Practical aspects of TEN-DD’s implementation highlighted the need to incorporate more systematic adaptations for minimally verbal students, as well as for learners who might need additional training in prerequisite skills

    An Integrated Approach to the Prediction of Chemotherapeutic Response in Patients with Breast Cancer

    BACKGROUND: A major challenge in oncology is the selection of the most effective chemotherapeutic agents for individual patients; the administration of ineffective chemotherapy increases mortality and decreases quality of life in cancer patients. This emphasizes the need to evaluate every patient's probability of responding to each chemotherapeutic agent and to limit the agents used to those most likely to be effective. METHODS AND RESULTS: Using gene expression data on the NCI-60 panel and corresponding drug sensitivity, mRNA and microRNA profiles were developed representing sensitivity to individual chemotherapeutic agents. The mRNA signatures were tested in an independent cohort of 133 breast cancer patients treated with the TFAC (paclitaxel, 5-fluorouracil, adriamycin, and cyclophosphamide) chemotherapy regimen. To further dissect the biology of resistance, we applied signatures of oncogenic pathway activation and performed hierarchical clustering. We then used mRNA signatures of chemotherapy sensitivity to identify alternative therapeutics for patients resistant to TFAC. Profiles from mRNA and microRNA expression data represent distinct biologic mechanisms of resistance to common cytotoxic agents. The individual mRNA signatures were validated in an independent dataset of breast tumors (P = 0.002, NPV = 82%). When the accuracy of the signatures was analyzed with respect to molecular variables, the predictive ability was found to be greater in basal-like than non-basal-like patients (P = 0.03 and P = 0.06). Samples from patients with co-activated Myc and E2F represented the cohort with the lowest percentage (8%) of responders. Using mRNA signatures of sensitivity to other cytotoxic agents, we predict that TFAC non-responders are more likely to be sensitive to docetaxel (P = 0.04), representing a viable alternative therapy. CONCLUSIONS: Our results suggest that the optimal strategy for chemotherapy sensitivity prediction integrates molecular variables such as ER and HER2 status with corresponding microRNA and mRNA expression profiles. Importantly, we also present evidence to support the concept that analysis of molecular variables can offer a rational strategy for identifying alternative therapeutic opportunities
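
    As a concrete piece of the validation arithmetic, the reported NPV = 82% is the fraction of patients the signature called non-sensitive who indeed did not respond. Below is a minimal sketch with hypothetical labels, not the study's data:

```python
# Negative predictive value from binary calls (illustrative data only).
import numpy as np

def negative_predictive_value(y_true, y_pred):
    """NPV = true negatives / all predicted negatives.
    Convention here: 1 = responder, 0 = non-responder."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    predicted_negative = y_pred == 0
    if not predicted_negative.any():
        return float("nan")
    return float(np.mean(y_true[predicted_negative] == 0))

y_true = np.array([0, 0, 1, 0, 1, 0, 0, 1])   # observed response
y_pred = np.array([0, 0, 1, 0, 0, 0, 1, 1])   # signature's call
print(f"NPV = {negative_predictive_value(y_true, y_pred):.2f}")
```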

    Sooty Mangabey Genome Sequence Provides Insight into AIDS Resistance in a Natural SIV Host

    In contrast to infections with human immunodeficiency virus (HIV) in humans and simian immunodeficiency virus (SIV) in macaques, SIV infection of a natural host, the sooty mangabey (Cercocebus atys), is non-pathogenic despite high viraemia. Here we sequenced and assembled the genome of a captive sooty mangabey. We conducted genome-wide comparative analyses of transcript assemblies from C. atys and AIDS-susceptible species, such as humans and macaques, to identify candidate host genetic factors that influence susceptibility. We identified several immune-related genes in the genome of C. atys that show substantial sequence divergence from macaques or humans. One of these divergences, a C-terminal frameshift in the toll-like receptor 4 (TLR4) gene of C. atys, is associated with a blunted in vitro response to TLR4 ligands. In addition, we found a major structural change in exons 3-4 of the immune-regulatory protein intercellular adhesion molecule 2 (ICAM-2); expression of this variant leads to reduced cell-surface expression of ICAM-2. These data provide a resource for comparative genomic studies of HIV and/or SIV pathogenesis and may help to elucidate the mechanisms by which SIV-infected sooty mangabeys avoid AIDS

    A Multilaboratory Comparison of Calibration Accuracy and the Performance of External References in Analytical Ultracentrifugation

    Analytical ultracentrifugation (AUC) is a first-principles-based method to determine absolute sedimentation coefficients and buoyant molar masses of macromolecules and their complexes, reporting on their size and shape in free solution. The purpose of this multi-laboratory study was to establish the precision and accuracy of basic data dimensions in AUC and validate previously proposed calibration techniques. Three kits of AUC cell assemblies containing radial and temperature calibration tools and a bovine serum albumin (BSA) reference sample were shared among 67 laboratories, generating 129 comprehensive data sets. These allowed for an assessment of many parameters of instrument performance, including accuracy of the reported scan time after the start of centrifugation, the accuracy of the temperature calibration, and the accuracy of the radial magnification. The range of sedimentation coefficients obtained for BSA monomer in different instruments and using different optical systems was from 3.655 S to 4.949 S, with a mean and standard deviation of (4.304 ± 0.188) S (4.4%). After the combined application of correction factors derived from the external calibration references for elapsed time, scan velocity, temperature, and radial magnification, the range of s-values was reduced 7-fold with a mean of 4.325 S and a 6-fold reduced standard deviation of ± 0.030 S (0.7%). In addition, the large data set provided an opportunity to determine the instrument-to-instrument variation of the absolute radial positions reported in the scan files, the precision of photometric or refractometric signal magnitudes, and the precision of the calculated apparent molar mass of BSA monomer and the fraction of BSA dimers. These results highlight the necessity and effectiveness of independent calibration of basic AUC data dimensions for reliable quantitative studies
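
    The combined correction can be pictured as a product of per-reference factors applied to the measured s-value. A hedged sketch follows; the multiplicative form mirrors the study's description, but the factor values below are invented for illustration:

```python
# Apply external-calibration correction factors to a measured s-value (S).
# Example factors are hypothetical, not values from the study.
def corrected_s(s_measured, f_time=1.0, f_temperature=1.0, f_radial=1.0):
    """Combined multiplicative correction from the elapsed-time/scan-
    velocity, temperature, and radial-magnification references."""
    return s_measured * f_time * f_temperature * f_radial

# Hypothetical instrument: 0.3% clock error, a small temperature offset,
# and a 0.4% radial magnification error.
print(f"{corrected_s(4.40, f_time=0.997, f_temperature=1.005, f_radial=0.996):.3f} S")
```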

    Measurement of the point-spread function of a noisy imaging system

    Estimation of the averaged point-spread function (PSF) of an image acquisition system is important for many computer vision applications, including edge detection and depth from defocus. The paper compares several mathematical models of the PSF and presents an improved measurement technique that enables subpixel estimation of 2D functions. New methods for noise suppression and uneven-illumination modeling were incorporated. The PSF was computed from an ensemble of edge-spread function measurements. The generalized Gaussian was shown to fit the estimated PSF 8 times better than the Gaussian and 14 times better than the pillbox model
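
    The model comparison can be reproduced in spirit by fitting the generalized Gaussian a·exp(−(|x|/α)^β) to a measured profile; β = 2 recovers the ordinary Gaussian, and large β approaches the pillbox. Here is a sketch on synthetic data; a real profile would come from differentiating the averaged edge-spread function:

```python
# Fit a generalized Gaussian to a (synthetic) 1-D PSF profile.
import numpy as np
from scipy.optimize import curve_fit

def generalized_gaussian(x, a, alpha, beta):
    """a * exp(-(|x|/alpha)**beta); beta = 2 gives a Gaussian."""
    return a * np.exp(-(np.abs(x) / alpha) ** beta)

x = np.linspace(-5, 5, 101)
psf = generalized_gaussian(x, 1.0, 1.2, 1.5) + 0.01 * np.random.randn(x.size)

params, _ = curve_fit(generalized_gaussian, x, psf, p0=(1.0, 1.0, 2.0))
print("a=%.3f  alpha=%.3f  beta=%.3f" % tuple(params))
```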

    Rational filter design for depth from defocus

    The paper describes a new, simple procedure to determine the rational filters used in the depth from defocus (DfD) procedure previously researched by Watanabe and Nayar [4]. Their DfD method uses two differently defocused images; the filters accurately model the relative defocus in the images and provide a fast calculation of distance. This paper presents a simple method to determine the filter coefficients by separating the M/P ratio into a linear model and a cubic error-correction model. The method avoids the previous iterative minimisation technique and computes efficiently. The model has been verified by comparison with the theoretical M/P ratio. The proposed filters have been compared with the previous ones for frequency response, closeness of fit to M/P, rotational symmetry, and measurement accuracy. Experiments were performed for several defocus conditions. It was observed that the new filters were largely insensitive to object texture and modelled the blur more precisely than the previous ones. Experiments with real planar images demonstrated a maximum RMS depth error of 1.18% for the proposed filters, compared to 1.54% for the previous filters. Complicated objects were also accurately measured
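
    The non-iterative coefficient fit can be illustrated as a single least-squares solve over linear and cubic terms of the normalized depth. The sample M/P curve below is synthetic; the paper derives the true curve from the theoretical defocus model:

```python
# Fit M/P(alpha) ~= c1*alpha + c3*alpha**3 in one least-squares step.
# The synthetic coefficients (0.8, -0.1) are illustrative only.
import numpy as np

alpha = np.linspace(-1, 1, 50)                 # normalized depth samples
m_over_p = 0.8 * alpha - 0.1 * alpha ** 3 + 0.002 * np.random.randn(50)

A = np.column_stack([alpha, alpha ** 3])       # linear + cubic basis
coeffs, *_ = np.linalg.lstsq(A, m_over_p, rcond=None)
print("linear = %.3f, cubic = %.3f" % tuple(coeffs))
```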